Shoval Benjer, id 319037404

Nadav Falkowski, id 207446543

Link to Kaggle account: https://www.kaggle.com/shovalbenjer

TL;DR

We load and preprocess the dataset by filling missing values, encoding categorical variables, and applying a Box-Cox transformation to the target variable. Our initial model, although restricted to linear models and gradient descent, achieved better results than later attempts. We split the data into training and validation sets, standardize the features, and train a Lasso regression model using Leave-One-Out Cross-Validation (LOOCV). We evaluate the model on the training set using RMSE, R², adjusted R², and MAE.


Intro - House Prices - Advanced Regression Techniques

Ask a home buyer to describe their dream house, and they probably won't begin with the height of the basement ceiling or the proximity to an east-west railroad. But this playground competition's dataset proves that much more influences price negotiations than the number of bedrooms or a white-picket fence.

With 79 explanatory variables describing (almost) every aspect of residential homes in Ames, Iowa, this competition challenges you to predict the final price of each home.

In [28]:
!pip install optuna
!pip install scikit-optimize
Requirement already satisfied: optuna in /usr/local/lib/python3.10/dist-packages (3.6.1)
Requirement already satisfied: alembic>=1.5.0 in /usr/local/lib/python3.10/dist-packages (from optuna) (1.13.2)
Requirement already satisfied: colorlog in /usr/local/lib/python3.10/dist-packages (from optuna) (6.8.2)
Requirement already satisfied: numpy in /usr/local/lib/python3.10/dist-packages (from optuna) (1.25.2)
Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.10/dist-packages (from optuna) (24.1)
Requirement already satisfied: sqlalchemy>=1.3.0 in /usr/local/lib/python3.10/dist-packages (from optuna) (2.0.31)
Requirement already satisfied: tqdm in /usr/local/lib/python3.10/dist-packages (from optuna) (4.66.4)
Requirement already satisfied: PyYAML in /usr/local/lib/python3.10/dist-packages (from optuna) (6.0.1)
Requirement already satisfied: Mako in /usr/local/lib/python3.10/dist-packages (from alembic>=1.5.0->optuna) (1.3.5)
Requirement already satisfied: typing-extensions>=4 in /usr/local/lib/python3.10/dist-packages (from alembic>=1.5.0->optuna) (4.12.2)
Requirement already satisfied: greenlet!=0.4.17 in /usr/local/lib/python3.10/dist-packages (from sqlalchemy>=1.3.0->optuna) (3.0.3)
Requirement already satisfied: MarkupSafe>=0.9.2 in /usr/local/lib/python3.10/dist-packages (from Mako->alembic>=1.5.0->optuna) (2.1.5)
Requirement already satisfied: scikit-optimize in /usr/local/lib/python3.10/dist-packages (0.10.2)
Requirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.10/dist-packages (from scikit-optimize) (1.4.2)
Requirement already satisfied: pyaml>=16.9 in /usr/local/lib/python3.10/dist-packages (from scikit-optimize) (24.4.0)
Requirement already satisfied: numpy>=1.20.3 in /usr/local/lib/python3.10/dist-packages (from scikit-optimize) (1.25.2)
Requirement already satisfied: scipy>=1.1.0 in /usr/local/lib/python3.10/dist-packages (from scikit-optimize) (1.11.4)
Requirement already satisfied: scikit-learn>=1.0.0 in /usr/local/lib/python3.10/dist-packages (from scikit-optimize) (1.2.2)
Requirement already satisfied: packaging>=21.3 in /usr/local/lib/python3.10/dist-packages (from scikit-optimize) (24.1)
Requirement already satisfied: PyYAML in /usr/local/lib/python3.10/dist-packages (from pyaml>=16.9->scikit-optimize) (6.0.1)
Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.10/dist-packages (from scikit-learn>=1.0.0->scikit-optimize) (3.5.0)
In [29]:
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
from scipy import stats
from scipy.stats import skew, boxcox
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression, SGDRegressor, LassoCV
from sklearn.metrics import mean_squared_error
from sklearn.preprocessing import PowerTransformer, QuantileTransformer, LabelEncoder, StandardScaler
from sklearn.neighbors import LocalOutlierFactor
import optuna
from optuna.samplers import TPESampler
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import RobustScaler
from sklearn.model_selection import LeaveOneOut
In [30]:
# Load the dataset
train_data = pd.read_csv('train.csv')
test_data = pd.read_csv('test.csv')

# Display the first few rows of the dataset
pd.set_option('display.max_columns', None)  # Show all columns
pd.set_option('display.width', None)        # Prevent line breaks
print("Train Data Head:")
print(train_data.head())

# Display descriptive statistics
print("\nTrain Data Description:")
print(train_data.describe())

# Display data types and non-null counts
print("\nTrain Data Info:")
print(train_data.info())
Train Data Head:
   Id  MSSubClass MSZoning  LotFrontage  LotArea Street Alley LotShape LandContour Utilities  \
0   1          60       RL         65.0     8450   Pave   NaN      Reg         Lvl    AllPub   
1   2          20       RL         80.0     9600   Pave   NaN      Reg         Lvl    AllPub   
2   3          60       RL         68.0    11250   Pave   NaN      IR1         Lvl    AllPub   
3   4          70       RL         60.0     9550   Pave   NaN      IR1         Lvl    AllPub   
4   5          60       RL         84.0    14260   Pave   NaN      IR1         Lvl    AllPub   

  LotConfig LandSlope Neighborhood Condition1 Condition2 BldgType HouseStyle  OverallQual  \
0    Inside       Gtl      CollgCr       Norm       Norm     1Fam     2Story            7   
1       FR2       Gtl      Veenker      Feedr       Norm     1Fam     1Story            6   
2    Inside       Gtl      CollgCr       Norm       Norm     1Fam     2Story            7   
3    Corner       Gtl      Crawfor       Norm       Norm     1Fam     2Story            7   
4       FR2       Gtl      NoRidge       Norm       Norm     1Fam     2Story            8   

   OverallCond  YearBuilt  YearRemodAdd RoofStyle RoofMatl Exterior1st Exterior2nd MasVnrType  \
0            5       2003          2003     Gable  CompShg     VinylSd     VinylSd    BrkFace   
1            8       1976          1976     Gable  CompShg     MetalSd     MetalSd        NaN   
2            5       2001          2002     Gable  CompShg     VinylSd     VinylSd    BrkFace   
3            5       1915          1970     Gable  CompShg     Wd Sdng     Wd Shng        NaN   
4            5       2000          2000     Gable  CompShg     VinylSd     VinylSd    BrkFace   

   MasVnrArea ExterQual ExterCond Foundation BsmtQual BsmtCond BsmtExposure BsmtFinType1  \
0       196.0        Gd        TA      PConc       Gd       TA           No          GLQ   
1         0.0        TA        TA     CBlock       Gd       TA           Gd          ALQ   
2       162.0        Gd        TA      PConc       Gd       TA           Mn          GLQ   
3         0.0        TA        TA     BrkTil       TA       Gd           No          ALQ   
4       350.0        Gd        TA      PConc       Gd       TA           Av          GLQ   

   BsmtFinSF1 BsmtFinType2  BsmtFinSF2  BsmtUnfSF  TotalBsmtSF Heating HeatingQC CentralAir  \
0         706          Unf           0        150          856    GasA        Ex          Y   
1         978          Unf           0        284         1262    GasA        Ex          Y   
2         486          Unf           0        434          920    GasA        Ex          Y   
3         216          Unf           0        540          756    GasA        Gd          Y   
4         655          Unf           0        490         1145    GasA        Ex          Y   

  Electrical  1stFlrSF  2ndFlrSF  LowQualFinSF  GrLivArea  BsmtFullBath  BsmtHalfBath  FullBath  \
0      SBrkr       856       854             0       1710             1             0         2   
1      SBrkr      1262         0             0       1262             0             1         2   
2      SBrkr       920       866             0       1786             1             0         2   
3      SBrkr       961       756             0       1717             1             0         1   
4      SBrkr      1145      1053             0       2198             1             0         2   

   HalfBath  BedroomAbvGr  KitchenAbvGr KitchenQual  TotRmsAbvGrd Functional  Fireplaces  \
0         1             3             1          Gd             8        Typ           0   
1         0             3             1          TA             6        Typ           1   
2         1             3             1          Gd             6        Typ           1   
3         0             3             1          Gd             7        Typ           1   
4         1             4             1          Gd             9        Typ           1   

  FireplaceQu GarageType  GarageYrBlt GarageFinish  GarageCars  GarageArea GarageQual GarageCond  \
0         NaN     Attchd       2003.0          RFn           2         548         TA         TA   
1          TA     Attchd       1976.0          RFn           2         460         TA         TA   
2          TA     Attchd       2001.0          RFn           2         608         TA         TA   
3          Gd     Detchd       1998.0          Unf           3         642         TA         TA   
4          TA     Attchd       2000.0          RFn           3         836         TA         TA   

  PavedDrive  WoodDeckSF  OpenPorchSF  EnclosedPorch  3SsnPorch  ScreenPorch  PoolArea PoolQC  \
0          Y           0           61              0          0            0         0    NaN   
1          Y         298            0              0          0            0         0    NaN   
2          Y           0           42              0          0            0         0    NaN   
3          Y           0           35            272          0            0         0    NaN   
4          Y         192           84              0          0            0         0    NaN   

  Fence MiscFeature  MiscVal  MoSold  YrSold SaleType SaleCondition  SalePrice  
0   NaN         NaN        0       2    2008       WD        Normal     208500  
1   NaN         NaN        0       5    2007       WD        Normal     181500  
2   NaN         NaN        0       9    2008       WD        Normal     223500  
3   NaN         NaN        0       2    2006       WD       Abnorml     140000  
4   NaN         NaN        0      12    2008       WD        Normal     250000  

Train Data Description:
                Id   MSSubClass  LotFrontage        LotArea  OverallQual  OverallCond  \
count  1460.000000  1460.000000  1201.000000    1460.000000  1460.000000  1460.000000   
mean    730.500000    56.897260    70.049958   10516.828082     6.099315     5.575342   
std     421.610009    42.300571    24.284752    9981.264932     1.382997     1.112799   
min       1.000000    20.000000    21.000000    1300.000000     1.000000     1.000000   
25%     365.750000    20.000000    59.000000    7553.500000     5.000000     5.000000   
50%     730.500000    50.000000    69.000000    9478.500000     6.000000     5.000000   
75%    1095.250000    70.000000    80.000000   11601.500000     7.000000     6.000000   
max    1460.000000   190.000000   313.000000  215245.000000    10.000000     9.000000   

         YearBuilt  YearRemodAdd   MasVnrArea   BsmtFinSF1   BsmtFinSF2    BsmtUnfSF  TotalBsmtSF  \
count  1460.000000   1460.000000  1452.000000  1460.000000  1460.000000  1460.000000  1460.000000   
mean   1971.267808   1984.865753   103.685262   443.639726    46.549315   567.240411  1057.429452   
std      30.202904     20.645407   181.066207   456.098091   161.319273   441.866955   438.705324   
min    1872.000000   1950.000000     0.000000     0.000000     0.000000     0.000000     0.000000   
25%    1954.000000   1967.000000     0.000000     0.000000     0.000000   223.000000   795.750000   
50%    1973.000000   1994.000000     0.000000   383.500000     0.000000   477.500000   991.500000   
75%    2000.000000   2004.000000   166.000000   712.250000     0.000000   808.000000  1298.250000   
max    2010.000000   2010.000000  1600.000000  5644.000000  1474.000000  2336.000000  6110.000000   

          1stFlrSF     2ndFlrSF  LowQualFinSF    GrLivArea  BsmtFullBath  BsmtHalfBath  \
count  1460.000000  1460.000000   1460.000000  1460.000000   1460.000000   1460.000000   
mean   1162.626712   346.992466      5.844521  1515.463699      0.425342      0.057534   
std     386.587738   436.528436     48.623081   525.480383      0.518911      0.238753   
min     334.000000     0.000000      0.000000   334.000000      0.000000      0.000000   
25%     882.000000     0.000000      0.000000  1129.500000      0.000000      0.000000   
50%    1087.000000     0.000000      0.000000  1464.000000      0.000000      0.000000   
75%    1391.250000   728.000000      0.000000  1776.750000      1.000000      0.000000   
max    4692.000000  2065.000000    572.000000  5642.000000      3.000000      2.000000   

          FullBath     HalfBath  BedroomAbvGr  KitchenAbvGr  TotRmsAbvGrd   Fireplaces  \
count  1460.000000  1460.000000   1460.000000   1460.000000   1460.000000  1460.000000   
mean      1.565068     0.382877      2.866438      1.046575      6.517808     0.613014   
std       0.550916     0.502885      0.815778      0.220338      1.625393     0.644666   
min       0.000000     0.000000      0.000000      0.000000      2.000000     0.000000   
25%       1.000000     0.000000      2.000000      1.000000      5.000000     0.000000   
50%       2.000000     0.000000      3.000000      1.000000      6.000000     1.000000   
75%       2.000000     1.000000      3.000000      1.000000      7.000000     1.000000   
max       3.000000     2.000000      8.000000      3.000000     14.000000     3.000000   

       GarageYrBlt   GarageCars   GarageArea   WoodDeckSF  OpenPorchSF  EnclosedPorch  \
count  1379.000000  1460.000000  1460.000000  1460.000000  1460.000000    1460.000000   
mean   1978.506164     1.767123   472.980137    94.244521    46.660274      21.954110   
std      24.689725     0.747315   213.804841   125.338794    66.256028      61.119149   
min    1900.000000     0.000000     0.000000     0.000000     0.000000       0.000000   
25%    1961.000000     1.000000   334.500000     0.000000     0.000000       0.000000   
50%    1980.000000     2.000000   480.000000     0.000000    25.000000       0.000000   
75%    2002.000000     2.000000   576.000000   168.000000    68.000000       0.000000   
max    2010.000000     4.000000  1418.000000   857.000000   547.000000     552.000000   

         3SsnPorch  ScreenPorch     PoolArea       MiscVal       MoSold       YrSold  \
count  1460.000000  1460.000000  1460.000000   1460.000000  1460.000000  1460.000000   
mean      3.409589    15.060959     2.758904     43.489041     6.321918  2007.815753   
std      29.317331    55.757415    40.177307    496.123024     2.703626     1.328095   
min       0.000000     0.000000     0.000000      0.000000     1.000000  2006.000000   
25%       0.000000     0.000000     0.000000      0.000000     5.000000  2007.000000   
50%       0.000000     0.000000     0.000000      0.000000     6.000000  2008.000000   
75%       0.000000     0.000000     0.000000      0.000000     8.000000  2009.000000   
max     508.000000   480.000000   738.000000  15500.000000    12.000000  2010.000000   

           SalePrice  
count    1460.000000  
mean   180921.195890  
std     79442.502883  
min     34900.000000  
25%    129975.000000  
50%    163000.000000  
75%    214000.000000  
max    755000.000000  

Train Data Info:
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 1460 entries, 0 to 1459
Data columns (total 81 columns):
 #   Column         Non-Null Count  Dtype  
---  ------         --------------  -----  
 0   Id             1460 non-null   int64  
 1   MSSubClass     1460 non-null   int64  
 2   MSZoning       1460 non-null   object 
 3   LotFrontage    1201 non-null   float64
 4   LotArea        1460 non-null   int64  
 5   Street         1460 non-null   object 
 6   Alley          91 non-null     object 
 7   LotShape       1460 non-null   object 
 8   LandContour    1460 non-null   object 
 9   Utilities      1460 non-null   object 
 10  LotConfig      1460 non-null   object 
 11  LandSlope      1460 non-null   object 
 12  Neighborhood   1460 non-null   object 
 13  Condition1     1460 non-null   object 
 14  Condition2     1460 non-null   object 
 15  BldgType       1460 non-null   object 
 16  HouseStyle     1460 non-null   object 
 17  OverallQual    1460 non-null   int64  
 18  OverallCond    1460 non-null   int64  
 19  YearBuilt      1460 non-null   int64  
 20  YearRemodAdd   1460 non-null   int64  
 21  RoofStyle      1460 non-null   object 
 22  RoofMatl       1460 non-null   object 
 23  Exterior1st    1460 non-null   object 
 24  Exterior2nd    1460 non-null   object 
 25  MasVnrType     588 non-null    object 
 26  MasVnrArea     1452 non-null   float64
 27  ExterQual      1460 non-null   object 
 28  ExterCond      1460 non-null   object 
 29  Foundation     1460 non-null   object 
 30  BsmtQual       1423 non-null   object 
 31  BsmtCond       1423 non-null   object 
 32  BsmtExposure   1422 non-null   object 
 33  BsmtFinType1   1423 non-null   object 
 34  BsmtFinSF1     1460 non-null   int64  
 35  BsmtFinType2   1422 non-null   object 
 36  BsmtFinSF2     1460 non-null   int64  
 37  BsmtUnfSF      1460 non-null   int64  
 38  TotalBsmtSF    1460 non-null   int64  
 39  Heating        1460 non-null   object 
 40  HeatingQC      1460 non-null   object 
 41  CentralAir     1460 non-null   object 
 42  Electrical     1459 non-null   object 
 43  1stFlrSF       1460 non-null   int64  
 44  2ndFlrSF       1460 non-null   int64  
 45  LowQualFinSF   1460 non-null   int64  
 46  GrLivArea      1460 non-null   int64  
 47  BsmtFullBath   1460 non-null   int64  
 48  BsmtHalfBath   1460 non-null   int64  
 49  FullBath       1460 non-null   int64  
 50  HalfBath       1460 non-null   int64  
 51  BedroomAbvGr   1460 non-null   int64  
 52  KitchenAbvGr   1460 non-null   int64  
 53  KitchenQual    1460 non-null   object 
 54  TotRmsAbvGrd   1460 non-null   int64  
 55  Functional     1460 non-null   object 
 56  Fireplaces     1460 non-null   int64  
 57  FireplaceQu    770 non-null    object 
 58  GarageType     1379 non-null   object 
 59  GarageYrBlt    1379 non-null   float64
 60  GarageFinish   1379 non-null   object 
 61  GarageCars     1460 non-null   int64  
 62  GarageArea     1460 non-null   int64  
 63  GarageQual     1379 non-null   object 
 64  GarageCond     1379 non-null   object 
 65  PavedDrive     1460 non-null   object 
 66  WoodDeckSF     1460 non-null   int64  
 67  OpenPorchSF    1460 non-null   int64  
 68  EnclosedPorch  1460 non-null   int64  
 69  3SsnPorch      1460 non-null   int64  
 70  ScreenPorch    1460 non-null   int64  
 71  PoolArea       1460 non-null   int64  
 72  PoolQC         7 non-null      object 
 73  Fence          281 non-null    object 
 74  MiscFeature    54 non-null     object 
 75  MiscVal        1460 non-null   int64  
 76  MoSold         1460 non-null   int64  
 77  YrSold         1460 non-null   int64  
 78  SaleType       1460 non-null   object 
 79  SaleCondition  1460 non-null   object 
 80  SalePrice      1460 non-null   int64  
dtypes: float64(3), int64(35), object(43)
memory usage: 924.0+ KB
None
In [31]:
# Output columns of train and test data
print("Train Data Columns:")
print(train_data.columns)

print("\nTest Data Columns:")
print(test_data.columns)
Train Data Columns:
Index(['Id', 'MSSubClass', 'MSZoning', 'LotFrontage', 'LotArea', 'Street', 'Alley', 'LotShape',
       'LandContour', 'Utilities', 'LotConfig', 'LandSlope', 'Neighborhood', 'Condition1',
       'Condition2', 'BldgType', 'HouseStyle', 'OverallQual', 'OverallCond', 'YearBuilt',
       'YearRemodAdd', 'RoofStyle', 'RoofMatl', 'Exterior1st', 'Exterior2nd', 'MasVnrType',
       'MasVnrArea', 'ExterQual', 'ExterCond', 'Foundation', 'BsmtQual', 'BsmtCond',
       'BsmtExposure', 'BsmtFinType1', 'BsmtFinSF1', 'BsmtFinType2', 'BsmtFinSF2', 'BsmtUnfSF',
       'TotalBsmtSF', 'Heating', 'HeatingQC', 'CentralAir', 'Electrical', '1stFlrSF', '2ndFlrSF',
       'LowQualFinSF', 'GrLivArea', 'BsmtFullBath', 'BsmtHalfBath', 'FullBath', 'HalfBath',
       'BedroomAbvGr', 'KitchenAbvGr', 'KitchenQual', 'TotRmsAbvGrd', 'Functional', 'Fireplaces',
       'FireplaceQu', 'GarageType', 'GarageYrBlt', 'GarageFinish', 'GarageCars', 'GarageArea',
       'GarageQual', 'GarageCond', 'PavedDrive', 'WoodDeckSF', 'OpenPorchSF', 'EnclosedPorch',
       '3SsnPorch', 'ScreenPorch', 'PoolArea', 'PoolQC', 'Fence', 'MiscFeature', 'MiscVal',
       'MoSold', 'YrSold', 'SaleType', 'SaleCondition', 'SalePrice'],
      dtype='object')

Test Data Columns:
Index(['Id', 'MSSubClass', 'MSZoning', 'LotFrontage', 'LotArea', 'Street', 'Alley', 'LotShape',
       'LandContour', 'Utilities', 'LotConfig', 'LandSlope', 'Neighborhood', 'Condition1',
       'Condition2', 'BldgType', 'HouseStyle', 'OverallQual', 'OverallCond', 'YearBuilt',
       'YearRemodAdd', 'RoofStyle', 'RoofMatl', 'Exterior1st', 'Exterior2nd', 'MasVnrType',
       'MasVnrArea', 'ExterQual', 'ExterCond', 'Foundation', 'BsmtQual', 'BsmtCond',
       'BsmtExposure', 'BsmtFinType1', 'BsmtFinSF1', 'BsmtFinType2', 'BsmtFinSF2', 'BsmtUnfSF',
       'TotalBsmtSF', 'Heating', 'HeatingQC', 'CentralAir', 'Electrical', '1stFlrSF', '2ndFlrSF',
       'LowQualFinSF', 'GrLivArea', 'BsmtFullBath', 'BsmtHalfBath', 'FullBath', 'HalfBath',
       'BedroomAbvGr', 'KitchenAbvGr', 'KitchenQual', 'TotRmsAbvGrd', 'Functional', 'Fireplaces',
       'FireplaceQu', 'GarageType', 'GarageYrBlt', 'GarageFinish', 'GarageCars', 'GarageArea',
       'GarageQual', 'GarageCond', 'PavedDrive', 'WoodDeckSF', 'OpenPorchSF', 'EnclosedPorch',
       '3SsnPorch', 'ScreenPorch', 'PoolArea', 'PoolQC', 'Fence', 'MiscFeature', 'MiscVal',
       'MoSold', 'YrSold', 'SaleType', 'SaleCondition'],
      dtype='object')

Quick basic data preprocessing before EDA

In [32]:
# Check for missing values
print("\nMissing Values:")
print(train_data.isnull().sum())
Missing Values:
Id                 0
MSSubClass         0
MSZoning           0
LotFrontage      259
LotArea            0
                ... 
MoSold             0
YrSold             0
SaleType           0
SaleCondition      0
SalePrice          0
Length: 81, dtype: int64
In [33]:
# Fill missing values (numeric: median; categorical: mode)
for col in train_data.select_dtypes(include=[np.number]).columns:
    train_data[col] = train_data[col].fillna(train_data[col].median())

for col in train_data.select_dtypes(include=['object']).columns:
    train_data[col] = train_data[col].fillna(train_data[col].mode()[0])

EDA - Exploratory Data Analysis

In [34]:
# Visualize the distribution of SalePrice
sns.histplot(train_data['SalePrice'], kde=True)
plt.title('Distribution of SalePrice')
plt.xlabel('SalePrice')
plt.ylabel('Frequency')
plt.show()
[Figure: histogram of SalePrice with KDE]

The distribution of SalePrice is right-skewed: most houses sell at lower prices, with a long tail of expensive homes. For an SGD/linear regression model this skewness can lead to suboptimal performance, since outliers dominate the squared-error loss. We will explore five transforms for normalizing the SalePrice distribution: log transform, square-root transform, Box-Cox transform, Yeo-Johnson transform, and rank (quantile) transform. More information on the Box-Cox and Yeo-Johnson transforms here:

In [35]:
train_data['LogSalePrice'] = np.log1p(train_data['SalePrice'])
train_data['SqrtSalePrice'] = np.sqrt(train_data['SalePrice'])
train_data['BoxCoxSalePrice'], _ = stats.boxcox(train_data['SalePrice'])
pt = PowerTransformer(method='yeo-johnson')
train_data['YeoJohnsonSalePrice'] = pt.fit_transform(train_data[['SalePrice']])
qt = QuantileTransformer(output_distribution='normal')
train_data['RankSalePrice'] = qt.fit_transform(train_data[['SalePrice']])
transformations = ['LogSalePrice', 'SqrtSalePrice', 'BoxCoxSalePrice', 'YeoJohnsonSalePrice', 'RankSalePrice']

plt.figure(figsize=(15, 10))
for i, col in enumerate(transformations, 1):
    plt.subplot(3, 2, i)
    sns.histplot(train_data[col], kde=True)
    plt.title(f'Distribution of {col}')
    plt.xlabel(col)
    plt.ylabel('Frequency')

plt.tight_layout()
plt.show()

# Encode categorical variables
label_encoders = {}
for col in train_data.select_dtypes(include=['object']).columns:
    le = LabelEncoder()
    train_data[col] = le.fit_transform(train_data[col].astype(str))
    label_encoders[col] = le
# Prepare the data
X = train_data.drop(['SalePrice', 'LogSalePrice', 'SqrtSalePrice', 'BoxCoxSalePrice', 'YeoJohnsonSalePrice', 'RankSalePrice'], axis=1)
# Function to evaluate models
def evaluate_model(target_column):
    y = train_data[target_column]
    X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)
    model = LinearRegression()
    model.fit(X_train, y_train)
    y_pred = model.predict(X_val)
    rmse = np.sqrt(mean_squared_error(y_val, y_pred))
    return rmse

# List of transformations
transformations = ['LogSalePrice', 'SqrtSalePrice', 'BoxCoxSalePrice', 'YeoJohnsonSalePrice', 'RankSalePrice']

# Evaluate each transformation
results = {}
for col in transformations:
    rmse = evaluate_model(col)
    results[col] = rmse

print("RMSE for each transformation:")
for key, value in results.items():
    print(f"{key}: {value}")
[Figure: histograms of the five transformed targets]
RMSE for each transformation:
LogSalePrice: 0.1553014163275464
SqrtSalePrice: 32.53012182031111
BoxCoxSalePrice: 0.06284945758572007
YeoJohnsonSalePrice: 0.39726995853079683
RankSalePrice: 0.37999784180570645

By selecting the Box-Cox-transformed SalePrice, we use the transformation with the lowest validation RMSE and align the remaining preprocessing and model-training steps with that choice. (Caveat: the RMSEs above are measured on different target scales, so they are not strictly comparable; a scale-free comparison would invert each transform back to dollars before scoring.)
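Because the model is trained in Box-Cox space, its predictions must be mapped back to the original dollar scale before they can be interpreted or submitted. A minimal sketch of the round trip, using a few hypothetical prices as a stand-in for SalePrice (Box-Cox requires strictly positive inputs):

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

# Hypothetical prices standing in for SalePrice (strictly positive)
prices = np.array([100000.0, 150000.0, 200000.0, 350000.0])

# stats.boxcox returns the transformed values AND the fitted lambda;
# the lambda must be kept to invert predictions later.
transformed, fitted_lambda = stats.boxcox(prices)

# inv_boxcox maps Box-Cox-space values back to the original scale
recovered = inv_boxcox(transformed, fitted_lambda)
print(np.allclose(recovered, prices))  # True
```

The key design point is saving the fitted lambda; discarding the second return value of `stats.boxcox` makes predictions impossible to invert.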

Relationships between features and SalePrice before the Box-Cox transformation (a power transform that requires strictly positive inputs)

In [36]:
# Visualize relationships between features and SalePrice
sns.boxplot(x='OverallQual', y='SalePrice', data=train_data)
plt.title('SalePrice vs. Overall Quality')
plt.xlabel('Overall Quality')
plt.ylabel('SalePrice')
plt.show()

sns.scatterplot(x='GrLivArea', y='SalePrice', data=train_data)
plt.title('SalePrice vs. Above Grade (Ground) Living Area')
plt.xlabel('Above Grade Living Area (GrLivArea)')
plt.ylabel('SalePrice')
plt.show()

sns.lineplot(x='YearBuilt', y='SalePrice', data=train_data)
plt.title('SalePrice vs. Year Built')
plt.xlabel('Year Built')
plt.ylabel('SalePrice')
plt.show()
[Figures: SalePrice vs. OverallQual (boxplot), GrLivArea (scatter), and YearBuilt (line)]

After Box-Cox Transformation

In [37]:
# Apply Box-Cox Transformation
train_data['BoxCoxSalePrice'], _ = stats.boxcox(train_data['SalePrice'])

# Visualize relationships between features and Box-Cox transformed SalePrice
plt.figure(figsize=(15, 5))

plt.subplot(1, 3, 1)
sns.boxplot(x='OverallQual', y='BoxCoxSalePrice', data=train_data)
plt.title('Box-Cox SalePrice vs. Overall Quality')
plt.xlabel('Overall Quality')
plt.ylabel('Box-Cox SalePrice')

plt.subplot(1, 3, 2)
sns.scatterplot(x='GrLivArea', y='BoxCoxSalePrice', data=train_data)
plt.title('Box-Cox SalePrice vs. Above Grade (Ground) Living Area')
plt.xlabel('Above Grade Living Area (GrLivArea)')
plt.ylabel('Box-Cox SalePrice')

plt.subplot(1, 3, 3)
sns.lineplot(x='YearBuilt', y='BoxCoxSalePrice', data=train_data)
plt.title('Box-Cox SalePrice vs. Year Built')
plt.xlabel('Year Built')
plt.ylabel('Box-Cox SalePrice')

plt.tight_layout()
plt.show()
[Figure: Box-Cox SalePrice vs. OverallQual, GrLivArea, and YearBuilt]
In [38]:
# Define the reg_plot function
def reg_plot(df, column):
    numeric_cols = df.select_dtypes(include=[np.number])
    for col in numeric_cols.columns:
        if col != column:
            plt.figure(figsize=(8, 6))
            sns.regplot(x=numeric_cols[col], y=numeric_cols[column])
            plt.title(f'Regression plot of {col} vs {column}')
            plt.ylabel(column)
            plt.xlabel(col)
            plt.show()

# Regression plots for Box-Cox Transformed SalePrice
reg_plot(train_data, 'BoxCoxSalePrice')
[Figures: regression plots of each numeric feature vs. BoxCoxSalePrice]

The Box-Cox transformation substantially normalizes the SalePrice distribution, and the reduced RMSE values indicate improved model performance. It also strengthens the linear relationships between the features and the target variable.
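As a quick sanity check on that claim, `scipy.stats.boxcox` returns both the transformed values and the fitted lambda, and `scipy.special.inv_boxcox` inverts it exactly. A minimal sketch on synthetic right-skewed data standing in for SalePrice (not the competition file):

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(0)
prices = rng.lognormal(mean=12, sigma=0.4, size=1000)  # right-skewed, like SalePrice

# boxcox returns the transformed array and the lambda it fitted
transformed, lam = stats.boxcox(prices)

# the transform should pull the skewness toward zero
print(f'skew before: {stats.skew(prices):.3f}, after: {stats.skew(transformed):.3f}')

# inv_boxcox(transformed, lam) recovers the original values
recovered = inv_boxcox(transformed, lam)
print(np.allclose(recovered, prices))
```

Keeping the fitted lambda is what makes the back-transformation of predictions exact later on.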

Complete Data Preprocessing

In [46]:
# Load the dataset
train_data = pd.read_csv('train.csv')
test_data = pd.read_csv('test.csv')

# Fill missing values (numeric: train median; categorical: train mode),
# always using training-set statistics to avoid leaking test information
for col in train_data.select_dtypes(include=[np.number]).columns:
    train_data[col].fillna(train_data[col].median(), inplace=True)
    if col in test_data.columns:
        test_data[col].fillna(train_data[col].median(), inplace=True)

for col in train_data.select_dtypes(include=['object']).columns:
    train_data[col].fillna(train_data[col].mode()[0], inplace=True)
    if col in test_data.columns:
        test_data[col].fillna(train_data[col].mode()[0], inplace=True)

# Encode categorical variables, fitting each encoder on the union of the
# train and test values so unseen test categories do not raise a ValueError
label_encoders = {}
for col in train_data.select_dtypes(include=['object']).columns:
    le = LabelEncoder()
    if col in test_data.columns:
        le.fit(pd.concat([train_data[col], test_data[col]]).astype(str))
        train_data[col] = le.transform(train_data[col].astype(str))
        test_data[col] = le.transform(test_data[col].astype(str))
    else:
        train_data[col] = le.fit_transform(train_data[col].astype(str))
    label_encoders[col] = le

# Add engineered features to train and test alike, before the feature
# matrices are built, so the models can actually use them
for df in (train_data, test_data):
    df['YrBltAndRemod'] = df['YearBuilt'] + df['YearRemodAdd']
    df['TotalSF'] = df['TotalBsmtSF'] + df['1stFlrSF'] + df['2ndFlrSF']
    df['Total_sqr_footage'] = (df['BsmtFinSF1'] + df['BsmtFinSF2'] +
                               df['1stFlrSF'] + df['2ndFlrSF'])
    df['Total_Bathrooms'] = (df['FullBath'] + (0.5 * df['HalfBath']) +
                             df['BsmtFullBath'] + (0.5 * df['BsmtHalfBath']))
    df['Total_porch_sf'] = (df['OpenPorchSF'] + df['3SsnPorch'] +
                            df['EnclosedPorch'] + df['ScreenPorch'] +
                            df['WoodDeckSF'])

# Apply Box-Cox transformation to the target, keeping the fitted lambda
# so predictions can be inverted back to the original scale later
train_data['BoxCoxSalePrice'], boxcox_lambda = stats.boxcox(train_data['SalePrice'])

# Prepare the data
X_train = train_data.drop(['SalePrice', 'BoxCoxSalePrice'], axis=1)
y_train = train_data['BoxCoxSalePrice']

# Ensure the test data has the same columns as the training data
X_test = test_data[X_train.columns].copy()

# Standardize the features
scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)

Feature engineering remains valuable even after applying the Box-Cox transformation: it creates new features that capture additional information and can improve model performance. The visualizations show that engineered features such as TotalSF and Total_Bathrooms have strong relationships with the transformed target variable (Box-Cox SalePrice).

In [40]:
# Visualize new features
plt.figure(figsize=(15, 5))

plt.subplot(1, 3, 1)
sns.scatterplot(x='TotalSF', y='BoxCoxSalePrice', data=train_data)
plt.title('Box-Cox SalePrice vs. Total Square Footage')
plt.xlabel('Total Square Footage')
plt.ylabel('Box-Cox SalePrice')

plt.subplot(1, 3, 2)
sns.scatterplot(x='YearBuilt', y='BoxCoxSalePrice', data=train_data)
plt.title('Box-Cox SalePrice vs. Year Built')
plt.xlabel('Year Built')
plt.ylabel('Box-Cox SalePrice')

plt.subplot(1, 3, 3)
sns.scatterplot(x='Total_Bathrooms', y='BoxCoxSalePrice', data=train_data)
plt.title('Box-Cox SalePrice vs. Total Bathrooms')
plt.xlabel('Total Bathrooms')
plt.ylabel('Box-Cox SalePrice')

plt.tight_layout()
plt.show()
No description has been provided for this image

First, the dataset is loaded; missing values in numerical columns are filled with their medians, and missing values in categorical columns with their most frequent values. Next, categorical variables are encoded with LabelEncoder, which turns text labels into numeric values for the transformation step that follows. A Box-Cox transformation is then applied to the target variable (SalePrice) to bring it closer to normality. The features are prepared by dropping the target columns from the training data and aligning the test data to the same columns. Finally, the features are standardized with StandardScaler.
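One caveat worth noting: a `LabelEncoder` fitted only on the training column raises a `ValueError` when the test column contains an unseen category; fitting on the union of both columns sidesteps this. A small sketch with made-up category values:

```python
import pandas as pd
from sklearn.preprocessing import LabelEncoder

train_col = pd.Series(['Gable', 'Hip', 'Gable'])
test_col = pd.Series(['Hip', 'Shed'])  # 'Shed' never appears in train

le = LabelEncoder()
le.fit(pd.concat([train_col, test_col]).astype(str))  # fit on the union

train_enc = le.transform(train_col)
test_enc = le.transform(test_col)  # no ValueError for 'Shed'
print(list(le.classes_))  # classes are stored sorted
```

Transforming the test column with a train-only encoder would fail here, since 'Shed' has no learned code.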

In [41]:
# Create correlation heatmap
numeric_cols = X_train.copy()
numeric_cols['BoxCoxSalePrice'] = y_train

plt.figure(figsize=(50, 50))
heatmap = sns.heatmap(numeric_cols.corr(), annot=True, fmt=".1f", cmap='coolwarm', linewidths=1)
heatmap.set_title('Correlation Heatmap (Box-Cox SalePrice)')
plt.show()
No description has been provided for this image

The correlation heatmap shows strong positive correlations between BoxCoxSalePrice and features like OverallQual, GrLivArea, GarageCars, and TotalBsmtSF. These features are crucial predictors of house prices, indicating that higher quality, larger living areas, and more garage space are strongly linked to higher sale prices.
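The same ranking can be read off programmatically by sorting the absolute correlations with the target column; this sketch uses a small synthetic frame, with `target` standing in for BoxCoxSalePrice and made-up column names:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 500
quality = rng.normal(size=n)   # stand-in for a strong predictor like OverallQual
area = rng.normal(size=n)      # weaker predictor
noise = rng.normal(size=n)     # irrelevant column

df = pd.DataFrame({
    'quality': quality,
    'area': area,
    'noise': noise,
    'target': 2.0 * quality + 1.0 * area + 0.1 * rng.normal(size=n),
})

# rank features by |correlation| with the target, strongest first
top_corr = df.corr()['target'].drop('target').abs().sort_values(ascending=False)
print(top_corr)
```

On the real data, the head of this series reproduces the features called out above without eyeballing an 80x80 heatmap.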

In [42]:
# Train and evaluate Linear Regression model
lr_model = LinearRegression()
lr_model.fit(X_train_scaled, y_train)
y_train_pred_lr = lr_model.predict(X_train_scaled)
train_rmse_lr = np.sqrt(mean_squared_error(y_train, y_train_pred_lr))

# Train and evaluate SGDRegressor model
sgd_model = SGDRegressor(max_iter=1000, tol=1e-3, random_state=42)
sgd_model.fit(X_train_scaled, y_train)
y_train_pred_sgd = sgd_model.predict(X_train_scaled)
train_rmse_sgd = np.sqrt(mean_squared_error(y_train, y_train_pred_sgd))

print(f'Linear Regression RMSE on Training Set: {train_rmse_lr}')
print(f'SGD Regressor RMSE on Training Set: {train_rmse_sgd}')

# Select the better model
best_model = lr_model if train_rmse_lr < train_rmse_sgd else sgd_model

# Predict on the test set using the best model
test_pred = best_model.predict(X_test_scaled)

# Handle infinite values
test_pred = np.where(np.isinf(test_pred), np.nan, test_pred)
test_pred = np.where(np.isnan(test_pred), np.nanmedian(test_pred), test_pred)

# Transform predictions back to the original scale by inverting the Box-Cox
# transform with its fitted lambda; np.expm1 is only correct for lambda = 0
# and overflows for predictions this large
from scipy.special import inv_boxcox
_, boxcox_lambda = stats.boxcox(train_data['SalePrice'])
test_pred_original_scale = inv_boxcox(test_pred, boxcox_lambda)

# Ensure no infinite or NaN values in the predictions
test_pred_original_scale = np.where(np.isinf(test_pred_original_scale), np.nan, test_pred_original_scale)
test_pred_original_scale = np.where(np.isnan(test_pred_original_scale), np.nanmedian(test_pred_original_scale), test_pred_original_scale)

# Prepare submission file
submission = pd.DataFrame({
    'Id': test_data['Id'],
    'SalePrice': test_pred_original_scale
})
submission.to_csv('submission.csv', index=False)
print('Submission file created successfully!')
Linear Regression RMSE on Training Set: 0.053512327205215385
SGD Regressor RMSE on Training Set: 0.05457415118047126
Submission file created successfully!

A Linear Regression model is trained and evaluated on the scaled training data, with RMSE as the performance metric. An SGDRegressor is trained and evaluated the same way, and the model with the lower RMSE is selected as the best model. The best model then predicts on the scaled test data; infinite values are handled, and the predictions are transformed back to the original scale by inverting the Box-Cox transformation. Finally, a submission file is prepared with the test data IDs and the predicted sale prices, ensuring no infinite or NaN values, and saved as 'submission.csv'.
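On the back-transformation: `np.expm1` only inverts a log1p target, i.e. Box-Cox with lambda = 0 on shifted data; the exact inverse for whatever lambda `stats.boxcox` fitted is `scipy.special.inv_boxcox`, which also avoids overflow on large predictions. A minimal sketch on synthetic data:

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(1)
y = rng.lognormal(mean=12, sigma=0.4, size=200)  # stand-in for SalePrice

# keep the fitted lambda instead of discarding it with `_`
transformed, lam = stats.boxcox(y)

# exact inverse of the Box-Cox transform for this lambda
recovered = inv_boxcox(transformed, lam)
print(np.allclose(recovered, y))

# expm1 is in general a different function; it only matches when lam is ~0
print(np.expm1(transformed[:3]))
```

Storing `lam` at transform time and passing it to `inv_boxcox` makes the round trip exact regardless of the fitted value.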

First try: lowest RMSE on the training set was 0.045.



image.png

In [43]:
# Split the training data into training and validation sets
X_train, X_val, y_train, y_val = train_test_split(X_train_scaled, y_train, test_size=0.2, random_state=42)

# Define LOOCV
loo = LeaveOneOut()

# Define LassoCV model with LOOCV
alphas = np.logspace(-4, 0, 50)
lasso_model = make_pipeline(RobustScaler(), LassoCV(alphas=alphas, max_iter=10000, random_state=42, cv=loo))

# Fit the model
lasso_model.fit(X_train, y_train)

# Predict on the training set
y_train_pred_lasso = lasso_model.predict(X_train)
train_rmse_lasso = np.sqrt(mean_squared_error(y_train, y_train_pred_lasso))
train_r2_lasso = r2_score(y_train, y_train_pred_lasso)
train_mae_lasso = mean_absolute_error(y_train, y_train_pred_lasso)

n = len(y_train)
p = X_train.shape[1]
train_adj_r2_lasso = 1 - (1 - train_r2_lasso) * (n - 1) / (n - p - 1)

print(f'Lasso Regression RMSE on Training Set: {train_rmse_lasso}')
print(f'Lasso Regression R² on Training Set: {train_r2_lasso}')
print(f'Lasso Regression Adjusted R² on Training Set: {train_adj_r2_lasso}')
print(f'Lasso Regression MAE on Training Set: {train_mae_lasso}')

# Predict on the validation set
y_val_pred_lasso = lasso_model.predict(X_val)
val_rmse_lasso = np.sqrt(mean_squared_error(y_val, y_val_pred_lasso))
val_r2_lasso = r2_score(y_val, y_val_pred_lasso)
val_mae_lasso = mean_absolute_error(y_val, y_val_pred_lasso)

n_val = len(y_val)
val_adj_r2_lasso = 1 - (1 - val_r2_lasso) * (n_val - 1) / (n_val - p - 1)

print(f'Lasso Regression RMSE on Validation Set: {val_rmse_lasso}')
print(f'Lasso Regression R² on Validation Set: {val_r2_lasso}')
print(f'Lasso Regression Adjusted R² on Validation Set: {val_adj_r2_lasso}')
print(f'Lasso Regression MAE on Validation Set: {val_mae_lasso}')

# Predict on the test set
test_pred_lasso = lasso_model.predict(X_test_scaled)

# Handle infinite values
test_pred_lasso = np.where(np.isinf(test_pred_lasso), np.nan, test_pred_lasso)
test_pred_lasso = np.where(np.isnan(test_pred_lasso), np.nanmedian(test_pred_lasso), test_pred_lasso)

# Transform predictions back to the original scale by inverting the Box-Cox
# transform with its fitted lambda (np.expm1 is only correct for lambda = 0)
from scipy.special import inv_boxcox
_, boxcox_lambda = stats.boxcox(train_data['SalePrice'])
test_pred_lasso_original_scale = inv_boxcox(test_pred_lasso, boxcox_lambda)

# Ensure no infinite or NaN values in the predictions
test_pred_lasso_original_scale = np.where(np.isinf(test_pred_lasso_original_scale), np.nan, test_pred_lasso_original_scale)
test_pred_lasso_original_scale = np.where(np.isnan(test_pred_lasso_original_scale), np.nanmedian(test_pred_lasso_original_scale), test_pred_lasso_original_scale)

# Prepare submission file
submission_lasso = pd.DataFrame({
    'Id': test_data['Id'],
    'SalePrice': test_pred_lasso_original_scale
})
submission_lasso.to_csv('submission_lasso.csv', index=False)
print('Lasso Submission file created successfully!')
Lasso Regression RMSE on Training Set: 0.05716019112185389
Lasso Regression R² on Training Set: 0.863293890306751
Lasso Regression Adjusted R² on Training Set: 0.853232723080017
Lasso Regression MAE on Training Set: 0.037538483804294955
Lasso Regression RMSE on Validation Set: 0.06367451350179677
Lasso Regression R² on Validation Set: 0.86205885684438
Lasso Regression Adjusted R² on Validation Set: 0.8097588973540975
Lasso Regression MAE on Validation Set: 0.04327726848974205
Lasso Submission file created successfully!
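The adjusted R² values printed above come from the standard penalty for model size, 1 - (1 - R²)(n - 1)/(n - p - 1); with roughly 80 predictors and only a few hundred validation rows, the penalty is visible. A small sketch (the n and p values here are illustrative, not read from the run):

```python
def adjusted_r2(r2: float, n: int, p: int) -> float:
    """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# illustrative numbers: R^2 = 0.86 with 80 predictors on 292 validation rows
print(adjusted_r2(0.86, n=292, p=80))

# the same R^2 on a much larger sample is barely penalized
print(adjusted_r2(0.86, n=10000, p=80))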

To execute this cell, re-run the preprocessing step first: the split above reassigns X_train and y_train, so they no longer match X_train_scaled.

In [47]:
def objective(trial):
    # suggest_float(..., log=True) replaces the deprecated suggest_loguniform
    alpha = trial.suggest_float('alpha', 1e-4, 1.0, log=True)
    lasso = make_pipeline(RobustScaler(), LassoCV(alphas=[alpha], max_iter=10000, random_state=42, cv=loo))

    # Fit the model
    lasso.fit(X_train_scaled, y_train)

    # Score on the training set (note: minimizing training RMSE tends to
    # favor the smallest alpha in the search range)
    y_train_pred = lasso.predict(X_train_scaled)
    train_rmse = np.sqrt(mean_squared_error(y_train, y_train_pred))

    return train_rmse

# Create a study and optimize the objective function
study = optuna.create_study(direction='minimize')
study.optimize(objective, n_trials=100)

# Get the best hyperparameters
best_alpha = study.best_params['alpha']
print(f'Best alpha: {best_alpha}')

# Train the final model with the best hyperparameters
final_lasso_model = make_pipeline(RobustScaler(), LassoCV(alphas=[best_alpha], max_iter=10000, random_state=42, cv=loo))
final_lasso_model.fit(X_train_scaled, y_train)

# Predict on the test set
test_pred_final_lasso = final_lasso_model.predict(X_test_scaled)

# Handle infinite values
test_pred_final_lasso = np.where(np.isinf(test_pred_final_lasso), np.nan, test_pred_final_lasso)
test_pred_final_lasso = np.where(np.isnan(test_pred_final_lasso), np.nanmedian(test_pred_final_lasso), test_pred_final_lasso)

# Transform predictions back to the original scale by inverting the Box-Cox
# transform with its fitted lambda (np.expm1 is only correct for lambda = 0)
from scipy.special import inv_boxcox
_, boxcox_lambda = stats.boxcox(train_data['SalePrice'])
test_pred_final_lasso_original_scale = inv_boxcox(test_pred_final_lasso, boxcox_lambda)

# Ensure no infinite or NaN values in the predictions
test_pred_final_lasso_original_scale = np.where(np.isinf(test_pred_final_lasso_original_scale), np.nan, test_pred_final_lasso_original_scale)
test_pred_final_lasso_original_scale = np.where(np.isnan(test_pred_final_lasso_original_scale), np.nanmedian(test_pred_final_lasso_original_scale), test_pred_final_lasso_original_scale)

# Prepare submission file
submission_final_lasso = pd.DataFrame({
    'Id': test_data['Id'],
    'SalePrice': test_pred_final_lasso_original_scale
})
submission_final_lasso.to_csv('submission_final_lasso.csv', index=False)
print('Final Lasso Submission file created successfully!')
[I 2024-07-08 15:10:25,679] A new study created in memory with name: no-name-53512fa7-f41f-4937-85bb-0b7389b927e2
[I 2024-07-08 15:10:40,107] Trial 0 finished with value: 0.053600281400385016 and parameters: {'alpha': 0.0002373628492740164}. Best is trial 0 with value: 0.053600281400385016.
[I 2024-07-08 15:10:50,405] Trial 1 finished with value: 0.1582032951430112 and parameters: {'alpha': 0.1882964770684149}. Best is trial 0 with value: 0.053600281400385016.
[I 2024-07-08 15:11:06,120] Trial 2 finished with value: 0.05360331109073908 and parameters: {'alpha': 0.0002415738899957665}. Best is trial 0 with value: 0.053600281400385016.
[I 2024-07-08 15:11:15,413] Trial 3 finished with value: 0.05387823625109227 and parameters: {'alpha': 0.0005054984430503233}. Best is trial 0 with value: 0.053600281400385016.
[I 2024-07-08 15:11:23,720] Trial 4 finished with value: 0.1582032951430112 and parameters: {'alpha': 0.10521513257204362}. Best is trial 0 with value: 0.053600281400385016.
[I 2024-07-08 15:11:31,860] Trial 5 finished with value: 0.05367582983719793 and parameters: {'alpha': 0.0003265928593629117}. Best is trial 0 with value: 0.053600281400385016.
[I 2024-07-08 15:11:42,632] Trial 6 finished with value: 0.053590175344318626 and parameters: {'alpha': 0.00022273750356348706}. Best is trial 6 with value: 0.053590175344318626.
[I 2024-07-08 15:11:51,302] Trial 7 finished with value: 0.06643625485545294 and parameters: {'alpha': 0.010710814830114745}. Best is trial 6 with value: 0.053590175344318626.
[I 2024-07-08 15:11:57,749] Trial 8 finished with value: 0.06645591967209519 and parameters: {'alpha': 0.010726915241263303}. Best is trial 6 with value: 0.053590175344318626.
[I 2024-07-08 15:12:06,440] Trial 9 finished with value: 0.05859062467178115 and parameters: {'alpha': 0.004178251786124619}. Best is trial 6 with value: 0.053590175344318626.
[I 2024-07-08 15:12:12,656] Trial 10 finished with value: 0.1582032951430112 and parameters: {'alpha': 0.8826362732023216}. Best is trial 6 with value: 0.053590175344318626.
[I 2024-07-08 15:12:25,816] Trial 11 finished with value: 0.05353036175739292 and parameters: {'alpha': 0.00010601065079441708}. Best is trial 11 with value: 0.05353036175739292.
[I 2024-07-08 15:12:34,818] Trial 12 finished with value: 0.055792444008499135 and parameters: {'alpha': 0.0019850870827978198}. Best is trial 11 with value: 0.05353036175739292.
[I 2024-07-08 15:12:41,748] Trial 13 finished with value: 0.0555761640075 and parameters: {'alpha': 0.0018282087234327187}. Best is trial 11 with value: 0.05353036175739292.
[I 2024-07-08 15:12:54,961] Trial 14 finished with value: 0.053529404135957255 and parameters: {'alpha': 0.0001031167650358191}. Best is trial 14 with value: 0.053529404135957255.
[I 2024-07-08 15:13:08,360] Trial 15 finished with value: 0.053529199888957074 and parameters: {'alpha': 0.00010247027249755223}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:13:17,898] Trial 16 finished with value: 0.05422500079321876 and parameters: {'alpha': 0.0007876197578625504}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:13:30,966] Trial 17 finished with value: 0.05353016928372587 and parameters: {'alpha': 0.00010544622383525156}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:13:37,155] Trial 18 finished with value: 0.09140149304259253 and parameters: {'alpha': 0.03498528674269773}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:13:45,919] Trial 19 finished with value: 0.05620864610481257 and parameters: {'alpha': 0.002309449410979086}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:13:52,940] Trial 20 finished with value: 0.05438998400332927 and parameters: {'alpha': 0.0009347482350332158}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:14:05,763] Trial 21 finished with value: 0.05353315138041949 and parameters: {'alpha': 0.00011409909202669533}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:14:18,634] Trial 22 finished with value: 0.05353126836395681 and parameters: {'alpha': 0.00010870396671642767}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:14:27,989] Trial 23 finished with value: 0.05406060632654284 and parameters: {'alpha': 0.0006428507960496405}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:14:38,905] Trial 24 finished with value: 0.05357036453085148 and parameters: {'alpha': 0.00019168892115027216}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:14:46,594] Trial 25 finished with value: 0.05387951097546848 and parameters: {'alpha': 0.0005065707802387852}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:14:55,254] Trial 26 finished with value: 0.059557704902662476 and parameters: {'alpha': 0.0049665781775936205}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:15:02,353] Trial 27 finished with value: 0.054612530288614965 and parameters: {'alpha': 0.001112830738165773}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:15:12,577] Trial 28 finished with value: 0.05370023044914137 and parameters: {'alpha': 0.00035062199655904586}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:15:23,988] Trial 29 finished with value: 0.05356638260013436 and parameters: {'alpha': 0.0001849070457225202}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:15:35,691] Trial 30 finished with value: 0.05355170324807586 and parameters: {'alpha': 0.00015760089151028772}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:15:47,129] Trial 31 finished with value: 0.053534307996424245 and parameters: {'alpha': 0.0001172656041652981}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:15:59,626] Trial 32 finished with value: 0.0535301393026804 and parameters: {'alpha': 0.0001053504345592639}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:16:09,898] Trial 33 finished with value: 0.05366893723982734 and parameters: {'alpha': 0.00031948956127232105}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:16:20,119] Trial 34 finished with value: 0.05367724557659707 and parameters: {'alpha': 0.000328041228427731}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:16:30,700] Trial 35 finished with value: 0.053573882247554154 and parameters: {'alpha': 0.0001975342501572628}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:16:37,531] Trial 36 finished with value: 0.09044684698214336 and parameters: {'alpha': 0.03410381867804417}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:16:47,395] Trial 37 finished with value: 0.05380599926818146 and parameters: {'alpha': 0.0004449122971831249}. Best is trial 15 with value: 0.053529199888957074.
[I 2024-07-08 15:16:55,752] Trial 38 finished with value: 0.05362879959452767 and parameters: {'alpha': 0.0002744643491212742}. Best is trial 15 with value: 0.053529199888957074.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:17:04,130] Trial 39 finished with value: 0.1582032951430112 and parameters: {'alpha': 0.3360212552449747}. Best is trial 15 with value: 0.053529199888957074.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:17:12,192] Trial 40 finished with value: 0.0548608646094081 and parameters: {'alpha': 0.0012969204229047032}. Best is trial 15 with value: 0.053529199888957074.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:17:24,883] Trial 41 finished with value: 0.05352873354572842 and parameters: {'alpha': 0.00010101700501005106}. Best is trial 41 with value: 0.05352873354572842.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:17:37,971] Trial 42 finished with value: 0.053528466135379885 and parameters: {'alpha': 0.00010016550969883015}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:17:49,529] Trial 43 finished with value: 0.0535653190507294 and parameters: {'alpha': 0.00018305537271135194}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:18:01,580] Trial 44 finished with value: 0.053552363566653216 and parameters: {'alpha': 0.00015891606896407162}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:18:11,319] Trial 45 finished with value: 0.053836005753865566 and parameters: {'alpha': 0.0004686408718489361}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:18:19,734] Trial 46 finished with value: 0.05362076706462809 and parameters: {'alpha': 0.0002645400907757118}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:18:32,972] Trial 47 finished with value: 0.05352942817614482 and parameters: {'alpha': 0.00010317152634893397}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:18:42,905] Trial 48 finished with value: 0.0540247839711391 and parameters: {'alpha': 0.0006176236980791251}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:18:49,981] Trial 49 finished with value: 0.08514183367664244 and parameters: {'alpha': 0.029341571084298107}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:19:01,554] Trial 50 finished with value: 0.05354652709771648 and parameters: {'alpha': 0.00014675597098677017}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:19:12,463] Trial 51 finished with value: 0.053588329679542135 and parameters: {'alpha': 0.00022000172598047755}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:19:24,950] Trial 52 finished with value: 0.05354032734074916 and parameters: {'alpha': 0.00013261235905780183}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:19:38,482] Trial 53 finished with value: 0.05352891263615161 and parameters: {'alpha': 0.000101566676522198}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:19:51,725] Trial 54 finished with value: 0.05352898603582869 and parameters: {'alpha': 0.00010181076155880009}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:20:02,171] Trial 55 finished with value: 0.053612023549616396 and parameters: {'alpha': 0.0002532967580055036}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:20:10,380] Trial 56 finished with value: 0.05374175558690872 and parameters: {'alpha': 0.0003895056021724334}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:20:22,165] Trial 57 finished with value: 0.053556473446644964 and parameters: {'alpha': 0.00016695226094166316}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:20:30,274] Trial 58 finished with value: 0.11669450292745552 and parameters: {'alpha': 0.05763590843673374}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:20:37,174] Trial 59 finished with value: 0.07470251128026413 and parameters: {'alpha': 0.01878780613475385}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:20:45,763] Trial 60 finished with value: 0.058952463915203736 and parameters: {'alpha': 0.004468029243719056}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:20:59,208] Trial 61 finished with value: 0.05352998438361559 and parameters: {'alpha': 0.00010487814271285051}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:21:12,420] Trial 62 finished with value: 0.05352905827924755 and parameters: {'alpha': 0.00010202643178757504}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:21:22,498] Trial 63 finished with value: 0.05354578225916267 and parameters: {'alpha': 0.00014512990585217143}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:21:33,060] Trial 64 finished with value: 0.05360503156955339 and parameters: {'alpha': 0.00024393377317383273}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:21:45,035] Trial 65 finished with value: 0.05354310412365094 and parameters: {'alpha': 0.00013913090917873927}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:21:56,202] Trial 66 finished with value: 0.053581315965382556 and parameters: {'alpha': 0.00020937534208484722}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:22:09,637] Trial 67 finished with value: 0.05352891264364907 and parameters: {'alpha': 0.00010156670156105333}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:22:16,061] Trial 68 finished with value: 0.06293171264431961 and parameters: {'alpha': 0.007553269721161362}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:22:25,803] Trial 69 finished with value: 0.0540589310372409 and parameters: {'alpha': 0.0006416943284741338}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:22:35,513] Trial 70 finished with value: 0.05366467236131221 and parameters: {'alpha': 0.0003150109831215525}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:22:46,558] Trial 71 finished with value: 0.05354038105657566 and parameters: {'alpha': 0.00013274619558419925}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:23:00,046] Trial 72 finished with value: 0.05352860795942982 and parameters: {'alpha': 0.00010061955820473852}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:23:11,459] Trial 73 finished with value: 0.05356292181345095 and parameters: {'alpha': 0.0001788470216893848}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:23:24,889] Trial 74 finished with value: 0.05352849255177635 and parameters: {'alpha': 0.00010025455864284343}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:23:37,198] Trial 75 finished with value: 0.0535402752636842 and parameters: {'alpha': 0.00013248246808813432}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:23:43,652] Trial 76 finished with value: 0.1582032951430112 and parameters: {'alpha': 0.939055100354887}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:23:54,337] Trial 77 finished with value: 0.05359197710286956 and parameters: {'alpha': 0.00022541869486269678}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:24:07,808] Trial 78 finished with value: 0.05352877781579703 and parameters: {'alpha': 0.0001011647977431113}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:24:16,163] Trial 79 finished with value: 0.1582032951430112 and parameters: {'alpha': 0.17402485847751248}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:24:26,881] Trial 80 finished with value: 0.053540683946795176 and parameters: {'alpha': 0.00013347310982111645}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:24:39,815] Trial 81 finished with value: 0.05352931236056606 and parameters: {'alpha': 0.00010281620197964366}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:24:51,324] Trial 82 finished with value: 0.05355872119446525 and parameters: {'alpha': 0.00017118593465262374}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:25:04,693] Trial 83 finished with value: 0.05352907192807628 and parameters: {'alpha': 0.00010207166951171745}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:25:14,965] Trial 84 finished with value: 0.053645498202378045 and parameters: {'alpha': 0.0002940310314354055}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:25:25,686] Trial 85 finished with value: 0.05357490423783001 and parameters: {'alpha': 0.0001992027045295891}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:25:36,634] Trial 86 finished with value: 0.05353882745259628 and parameters: {'alpha': 0.00012896836150516267}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:25:48,388] Trial 87 finished with value: 0.053556118710599454 and parameters: {'alpha': 0.00016628273764972615}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:25:58,406] Trial 88 finished with value: 0.053751329917626156 and parameters: {'alpha': 0.0003982663771574504}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:26:06,609] Trial 89 finished with value: 0.05735651314431694 and parameters: {'alpha': 0.0032289606038883247}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:26:16,344] Trial 90 finished with value: 0.05360158972340862 and parameters: {'alpha': 0.00023918929882745063}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:26:29,366] Trial 91 finished with value: 0.05353093480425123 and parameters: {'alpha': 0.00010771704720757593}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:26:42,828] Trial 92 finished with value: 0.05352889220771959 and parameters: {'alpha': 0.00010152261482031507}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:26:55,149] Trial 93 finished with value: 0.05354008849409744 and parameters: {'alpha': 0.0001320407426020035}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:27:06,610] Trial 94 finished with value: 0.05355576607726485 and parameters: {'alpha': 0.00016558950760326392}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:27:19,376] Trial 95 finished with value: 0.05353398890018321 and parameters: {'alpha': 0.00011640406033965158}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:27:28,394] Trial 96 finished with value: 0.053575726235302865 and parameters: {'alpha': 0.00020052979765347698}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:27:40,247] Trial 97 finished with value: 0.05355098717958941 and parameters: {'alpha': 0.0001561333895128787}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:27:48,707] Trial 98 finished with value: 0.1582032951430112 and parameters: {'alpha': 0.5541632476436107}. Best is trial 42 with value: 0.053528466135379885.
<ipython-input-47-cc253058a798>:2: FutureWarning: suggest_loguniform has been deprecated in v3.0.0. This feature will be removed in v6.0.0. See https://github.com/optuna/optuna/releases/tag/v3.0.0. Use suggest_float(..., log=True) instead.
  alpha = trial.suggest_loguniform('alpha', 1e-4, 1.0)
[I 2024-07-08 15:28:01,350] Trial 99 finished with value: 0.053537503348076936 and parameters: {'alpha': 0.0001256458008209723}. Best is trial 42 with value: 0.053528466135379885.
Best alpha: 0.00010016550969883015
Final Lasso Submission file created successfully!

We were ranked 4891/25000 submissions on our first try, which is not bad at all. Nevertheless, we can improve with more techniques. For the second attempt we added cross-validation and hyperparameter tuning with Optuna.

As the results show, even after combining Lasso with LOOCV and adding Optuna hyperparameter tuning, we still scored worse than on our first try.

image.png

Conclusion¶

Our first linear regressor, after the Box-Cox transformation, achieved our best result (4.035). Restricting ourselves to linear regression and gradient descent methods limited our progress. In later projects we will try methods such as gradient boosting, ensembles, blending different models, outlier removal, stacking, and unsupervised learning.
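One of the directions mentioned above, stacking, can be sketched with scikit-learn's `StackingRegressor`. The base learners and hyperparameters here are illustrative choices on synthetic data, not models tuned for the Ames dataset:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, StackingRegressor
from sklearn.linear_model import Lasso, Ridge
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data for illustration.
X, y = make_regression(n_samples=300, n_features=30, noise=0.2, random_state=0)

# One linear and one tree-based base learner, blended by a ridge meta-learner.
stack = StackingRegressor(
    estimators=[
        ('lasso', Lasso(alpha=1e-3, max_iter=10000)),
        ('gbr', GradientBoostingRegressor(random_state=0)),
    ],
    final_estimator=Ridge(alpha=1.0),
    cv=5,  # out-of-fold base predictions feed the meta-learner
)
rmse = -cross_val_score(stack, X, y, cv=5,
                        scoring='neg_root_mean_squared_error').mean()
print(f'Stacked CV RMSE: {rmse:.3f}')
```

The `cv` argument inside `StackingRegressor` is what prevents the meta-learner from seeing in-fold predictions, which is the usual source of leakage when blending models by hand.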